fix: Guard -1 sentinel in manual process endpoint (#160) #163

Merged
deucebucket merged 2 commits into develop from fix/issue-160-sentinel-guard
Feb 19, 2026

Conversation

@deucebucket
Owner

Summary

Addresses vibe-check review findings on PR #162. The -1 rate-limit sentinel from process_queue() leaks into the /api/process endpoint (manual "Process Batch" button), causing {processed: -1, verified: -1} in the API response when the user is rate-limited.

  • Guard total_processed += max(0, l2_processed) in app.py:7782 (see the sketch after this list)
  • Update process_queue docstring to document the -1 return value
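For readers without the diff open, here is a minimal sketch of the guarded accumulation, assuming a handler shaped roughly like the one described above; process_queue is stubbed and the handler body is illustrative, not the actual app.py code.

```python
def process_queue():
    # Stub standing in for the real layer-2 processor; here it always
    # returns the (-1, 0) rate-limit sentinel so the guard is exercised.
    return -1, 0


def api_process():
    total_processed = 0
    total_fixed = 0

    l2_processed, l2_fixed = process_queue()

    # Issue #160: process_queue() signals rate limiting with -1.
    # Clamp the sentinel so it never leaks into the running total
    # or the JSON response.
    total_processed += max(0, l2_processed)
    total_fixed += l2_fixed  # second element is always 0 on rate limit

    return {"processed": total_processed,
            "verified": total_processed - total_fixed}


print(api_process())  # {'processed': 0, 'verified': 0}
```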

Test plan

  • Trigger a manual process batch while rate-limited and verify the response shows processed: 0, not -1 (a hypothetical automated version is sketched after this list)
  • Normal processing still returns correct counts
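A hedged sketch of how that first check could be automated, assuming app.py is a Flask app exposing app and a module-level process_queue that pytest's monkeypatch can replace, and that the endpoint accepts POST; none of that is confirmed by this PR, so treat it as a starting point rather than a drop-in test.

```python
# Hypothetical pytest sketch: the Flask test client, the HTTP method, and
# the monkeypatch target are assumptions, not part of this PR.
import app as app_module


def test_manual_process_clamps_rate_limit_sentinel(monkeypatch):
    # Simulate the rate-limited batch: process_queue returns its sentinel.
    monkeypatch.setattr(app_module, "process_queue", lambda *a, **k: (-1, 0))

    client = app_module.app.test_client()
    data = client.post("/api/process").get_json()

    assert data["processed"] == 0  # sentinel clamped, not -1
    assert data["verified"] == 0
```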


@bucket-agent (bot) left a comment


🔍 Vibe Check Review

Context

This PR adds a max(0, l2_processed) guard in api_process (the manual /api/process endpoint) to prevent the -1 rate-limit sentinel introduced by PR #162 from corrupting total_processed, plus a docstring update documenting that contract.

Codebase Patterns I Verified

  • Sentinel contract established in PR #162: process_queue was changed from return 0, 0 to return -1, 0 on rate limit (SHA bae9c707b86ec3). PR #163 is built on top of that change (see the sketch after this list).
  • Worker.py already guarded (PR #162): layer 4 loop in worker.py:387–393 already handles processed == -1 with a 30s sleep + continue — that path is covered.
  • Only one other call site in app.py: process_queue is called at app.py:7684 (the one being fixed). No other unguarded callers in app.py. Confirmed via grep.
  • l2_fixed is always 0 on rate-limit: process_queue returns (-1, 0) — second element is always 0 — so total_fixed += l2_fixed is safe without a guard.
  • Downstream verified = processed - fixed at app.py:7697 would have produced -1 if rate-limited without this fix. Now it stays at 0.
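Taken together, a minimal sketch of the contract these bullets describe: process_queue signalling rate limiting with (-1, 0), and the already-guarded worker loop backing off on that sentinel. Only the sentinel value, the 30-second sleep, and the worker.py line range come from the review; the docstring wording and loop body are illustrative.

```python
import time


def process_queue():
    """Process one layer-2 batch from the queue.

    Returns:
        tuple[int, int]: (processed, fixed) counts for the batch.
        Returns (-1, 0) as a sentinel when the batch is rate-limited
        (contract introduced in PR #162).
    """
    rate_limited = True  # stand-in for the real rate-limit detection
    if rate_limited:
        return -1, 0
    return 0, 0


def layer4_loop():
    # Mirrors the already-guarded loop at worker.py:387-393: back off
    # for 30 seconds on the sentinel, then retry.
    while True:
        processed, fixed = process_queue()
        if processed == -1:
            time.sleep(30)
            continue
        # normal handling of processed/fixed would follow here
        break
```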

✅ Good

  • Minimal, surgical fix — exactly the right scope for a follow-on sentinel guard.
  • max(0, l2_processed) is the correct idiom: no special-case branching, works for any negative sentinel value, doesn't mask future bugs (a -2 would also be clamped rather than corrupting the total).
  • Docstring update accurately documents the contract, which is the right place to record it given the function signature doesn't change.
  • The # Issue #160: comment creates a traceable link to the root bug.

🚨 Issues Found

No issues found.

📋 Scope Verification

Issue | Problem | Addressed? | Notes
#160 | Rate-limited batches falsely trigger exhaustion | Yes | Core fix was PR #162; this PR closes the remaining gap in the manual endpoint call path
#162 (merged) | Sentinel introduced by this fix needed guarding in api_process | Yes | Exactly what this PR does

Scope Status: SCOPE_OK

📝 Documentation Check

This is a fix: PR. No CHANGELOG entry or version bump is included. Given this is a pure defensive guard with zero user-visible behavior change (rate-limiting was already handled in the worker; the manual endpoint just returned a slightly wrong count that nobody would catch), skipping a version bump is defensible. If the project pattern is "every fix gets a bump," that's LOW and doesn't block merge.

  • CHANGELOG.md: ⚠️ Not updated (minor, acceptable for a follow-on patch to an already-documented fix)
  • README.md: N/A

🎯 Verdict

APPROVE

Clean, focused, correct. The max(0, l2_processed) guard is the right fix for the right call site. No regressions, no scope creep, no security or correctness issues.

@deucebucket merged commit 8730b71 into develop on Feb 19, 2026
3 checks passed